Programme for International Student Assessment

Abbreviation: PISA
Formation: 1997
Purpose/focus: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Location: 2 rue André Pascal, 75775 Paris Cedex 16
Region served: World
Membership: 59 government education departments
Head of the Indicators and Analysis Division: Andreas Schleicher
Main organ: PISA Governing Body (chair: Lorna Bertrand, England)
Parent organization: OECD
Website: PISA

The Programme for International Student Assessment (PISA) is a worldwide evaluation of the scholastic performance of 15-year-old school pupils, first performed in 2000 and repeated every three years. It is coordinated by the Organisation for Economic Co-operation and Development (OECD) and covers both OECD member and partner countries (65 countries and economies took part in the 2009 cycle), with a view to improving educational policies and outcomes. A similar study, the Trends in International Mathematics and Science Study (TIMSS), focuses on mathematics and science but not reading.


Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics, and science.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and life-long learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts".[1]


Development and implementation

Developed from 1997, the first PISA assessment was carried out in 2000. The results of each assessment cycle take about a year and a half to analyse: the first results were published in November 2001, and the raw data, technical report, and data handbook were released only in spring 2002. The triennial repeats follow a similar schedule; seeing a single PISA cycle through from start to finish always takes over four years.

Each assessment cycle focuses on one of the three competence fields (reading, mathematics, science), but the other two are tested as well. A full rotation is completed every nine years: after 2000, reading was again the main domain in 2009.

Period | Main focus  | OECD countries | Other countries | Students | Notes
2000   | Reading     | 28             | 4               | 265,000  | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003   | Mathematics | 30             | 11              | 275,000  | UK disqualified from data analysis. Also included test in problem solving.
2006   | Science     | 30             | 27              |          |
2009   | Reading     | 34             | 33?             |          | Results made available on 7 December 2010.[3]

PISA is sponsored, governed, and coordinated by the OECD. The test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across participating countries. The assessment instruments fundamental to PISA's reading, mathematics, science, problem-solving, and computer-based testing, as well as the background and contextual questionnaires, are likewise constructed and refined by ACER. ACER also develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

Method of testing

Sampling

The students tested by PISA are aged between 15 years 3 months and 16 years 2 months at the beginning of the assessment period. The school year the pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.
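
The age window amounts to a simple eligibility rule. A minimal Python sketch (the function names and the month arithmetic here are illustrative, not part of any official PISA tooling):

    from datetime import date

    def add_months(d: date, months: int) -> date:
        # Shift a date by whole months; the day is clamped to 28 to
        # avoid invalid dates, which is fine for an illustration.
        y, m = divmod(d.month - 1 + months, 12)
        return d.replace(year=d.year + y, month=m + 1, day=min(d.day, 28))

    def is_pisa_eligible(birth: date, assessment_start: date) -> bool:
        # True if the student is between 15 years 3 months and
        # 16 years 2 months old at the start of the assessment period.
        lower = add_months(birth, 15 * 12 + 3)   # turns 15y 3m
        upper = add_months(birth, 16 * 12 + 2)   # turns 16y 2m
        return lower <= assessment_start <= upper

    # A student born 1 March 1990 is 16 years 1 month old in April 2006:
    print(is_pisa_eligible(date(1990, 3, 1), date(2006, 4, 1)))  # True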

To fulfil OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.
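
The school-based sampling can be pictured as a two-stage draw: sample schools first, then students within each sampled school. The sketch below shows only that two-stage structure; the real procedure designed by ACER uses stratification and probability-proportional-to-size selection, and the parameter values here are invented:

    import random

    def two_stage_sample(schools, n_schools=150, per_school=35, seed=0):
        # Toy two-stage sample: pick schools, then pupils within each.
        # `schools` is a list of dicts with an 'eligible_students' list.
        rng = random.Random(seed)
        picked = rng.sample(schools, min(n_schools, len(schools)))
        students = []
        for school in picked:
            pupils = school["eligible_students"]
            students.extend(rng.sample(pupils, min(per_school, len(pupils))))
        return students

    # With about 150 schools and 35 students per school, the sample
    # lands just above the 5,000-student minimum mentioned above.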

The test

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material in total, but no single student is tested on all of it. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors also fill in a questionnaire describing school demographics, funding, and so on.
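
The fact that no student sees all of the material is handled by rotating clusters of items across test booklets. A minimal sketch of such a rotated design (the 13-cluster, four-per-booklet layout is merely PISA-style; the official design varies by cycle):

    def make_booklets(n_clusters=13, per_booklet=4):
        # Rotate item clusters C1..Cn across n booklets so each booklet
        # holds a different subset and every cluster appears equally often.
        clusters = [f"C{i}" for i in range(1, n_clusters + 1)]
        return [
            [clusters[(start + k) % n_clusters] for k in range(per_booklet)]
            for start in range(n_clusters)
        ]

    for i, booklet in enumerate(make_booklets(), 1):
        print(f"Booklet {i:2d}: {' '.join(booklet)}")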

In selected countries, PISA has also started experimenting with computer adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E = Ergänzung, i.e. complement). The test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take only the latter; this large sample is needed to allow analysis by federal state. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[4]

Data scaling

From the beginning, PISA was designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be scaled to allow meaningful comparisons. This scaling is done using the Rasch model of item response theory (IRT). Under IRT, it is not possible to estimate the competence of students who solved none, or all, of the test items; this problem is circumvented by imposing a Gaussian prior probability distribution on competences.[5]

One and the same scale is used to express item difficulties and student competences. The scaling procedure is tuned such that the posterior distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.
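
To make the two ingredients concrete, here is a minimal Python sketch of the Rasch item-response function and of the linear transformation onto the reporting scale. The latent abilities are simulated stand-ins, not estimates from real response data:

    import numpy as np

    def rasch_p_correct(theta, b):
        # Rasch model: probability that a student of ability theta
        # solves an item of difficulty b (both on the same logit scale).
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    print(rasch_p_correct(0.0, 0.0))  # 0.5 when ability matches difficulty

    # Simulated latent abilities, standing in for the posterior means a
    # scaling run would produce (the Gaussian prior mentioned above is
    # what keeps all-correct and all-wrong students on the scale).
    rng = np.random.default_rng(42)
    theta = rng.normal(0.0, 1.0, size=100_000)

    # Linear transform to the reporting scale: the pooled distribution
    # gets mean 500 and standard deviation 100.
    scores = 500 + 100 * (theta - theta.mean()) / theta.std()
    print(round(scores.mean()), round(scores.std()))  # 500 100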

Results

Historical league tables

All PISA results are broken down by country. Public attention concentrates on just one outcome: mean achievement values by country. These data are regularly published in the form of "league tables".

The following tables give the mean achievements of OECD member and partner countries in the principal testing domain of each period:[6]

In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables, indicating for each pair of countries whether or not the mean score difference is statistically significant (i.e. unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.
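
The 9-point figure follows from the sampling error of the two country means. A back-of-the-envelope check, with standard errors chosen purely for illustration (typical PISA standard errors are a few points):

    import math

    # Hypothetical standard errors of two country means.
    se_a, se_b = 3.2, 3.2

    se_diff = math.sqrt(se_a**2 + se_b**2)  # SE of the score difference
    threshold = 1.96 * se_diff              # 95% significance threshold

    print(round(threshold, 1))  # about 8.9 points, close to the 9-point rule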

In some popular media, test results from all three literacy domains have been consolidated into an overall country ranking. Such meta-analysis is not endorsed by the OECD; the official reports contain only domain-specific country scores. In parts of the official reports, however, scores from a period's principal testing domain are used as a proxy for overall student ability.[7]

2000–2006

Top results for the main areas of investigation of PISA, in 2000, 2003 and 2006.

Reading literacy (2000)[8]
1.  Finland 546
2.  Canada 534
3.  New Zealand 529
4.  Australia 528
5.  Ireland 527
6.  South Korea 525
7.  United Kingdom 523
8.  Japan 522
9.  Sweden 516
10.  Austria 507
11.  Belgium 507
12.  Iceland 507
13.  Norway 505
14.  France 505
15.  United States 504
16.  Denmark 497
17.  Switzerland 494
18.  Spain 493
19.  Czech Republic 492
20.  Italy 487
21.  Germany 484
22.  Hungary 480
23.  Poland 479
24.  Greece 474
25.  Portugal 470
26.  Russia 462
27.  Latvia 458
28.  Luxembourg 441
29.  Mexico 422
30.  Brazil 396

Mathematics (2003)
1.  Finland 544
2.  South Korea 542
3.  Netherlands 538
4.  Japan 534
5.  Canada 532
6.  Belgium 529
7.  Switzerland 527
8.  Australia 524
9.  New Zealand 523
10.  Czech Republic 516
11.  Iceland 515
12.  Denmark 514
13.  France 511
14.  Sweden 509
15.  Austria 506
16.  Germany 503
17.  Ireland 503
18.  Slovakia 498
19.  Norway 495
20.  Luxembourg 493
21.  Poland 490
22.  Hungary 490
23.  Spain 485
24.  United States 483
25.  Italy 466
26.  Portugal 466
27.  Greece 445
28.  Turkey 423
29.  Mexico 385

Science (2006)
1.  Finland 563
2.  Canada 534
3.  Japan 531
4.  New Zealand 530
5.  Australia 527
6.  Netherlands 525
7.  South Korea 522
8.  Germany 516
9.  United Kingdom 515
10.  Czech Republic 513
11.  Switzerland 512
12.  Austria 511
13.  Belgium 510
14.  Ireland 508
15.  Hungary 504
16.  Sweden 503
17.  Poland 498
18.  Denmark 496
19.  France 495
20.  Iceland 491
21.  United States 489
22.  Slovakia 488
23.  Spain 488
24.  Norway 487
25.  Luxembourg 486
26.  Italy 475
27.  Portugal 474
28.  Greece 473
29.  Turkey 424
30.  Mexico 410
  1. ^ Chapter 2 of the publication "PISA 2003 Assessment Framework", pdf
  2. ^ Official PISA site data. For list See "Executive Summary"
  3. ^ http://www.oecd.org/document/34/0,3343,en_2649_35845621_44949730_1_1_1_1,00.html
  4. ^ C. Füller: Pisa hat einen kleinen, fröhlichen Bruder ("PISA has a little, cheerful brother"). taz, 5 December 2007.
  5. ^ The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003, and 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke, Die Insignifikanz signifikanter Unterschiede ("The insignificance of significant differences", 2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
  6. ^ OECD (2001) p. 53; OECD (2004a) p. 92; OECD (2007) p. 56.
  7. ^ E.g. OECD (2001), chapters 7 and 8: Influence of school organization and socio-economic background upon performance in the reading test. Reading was the main domain of PISA 2000.
  8. ^ http://www.mpib-berlin.mpg.de/Pisa/PISA-2000_Overview.pdf


2006

Programme for International Student Assessment (2006)
(OECD member countries in boldface)
Maths
1.  Taiwan 549
2.  Finland 548
3.  Hong Kong 547
3.  South Korea 547
5.  Netherlands 531
6.  Switzerland 530
7.  Canada 527
8.  Macau 525
8.  Liechtenstein 525
10.  Japan 523

Sciences
1.  Finland 563
2.  Hong Kong 542
3.  Canada 534
4.  Taiwan 532
5.  Estonia 531
5.  Japan 531
7.  New Zealand 530
8.  Australia 527
9.  Netherlands 525
10.  Liechtenstein 522

Reading
1.  South Korea 556
2.  Finland 547
3.  Hong Kong 536
4.  Canada 527
5.  New Zealand 521
6.  Ireland 517
7.  Australia 513
8.  Liechtenstein 510
9.  Poland 508
10.  Sweden 507

Top 10 countries in the PISA 2006 results for maths, sciences and reading.

2009

Programme for International Student Assessment (2009)[1]
(OECD members as of the time of the study in boldface)
Maths
1. Shanghai, China 600
2.  Singapore 562
3.  Hong Kong, China 555
4.  South Korea 546
5.  Taiwan 543
6.  Finland 541
7.  Liechtenstein 536
8.  Switzerland 534
9.  Japan 529
10.  Canada 527
11.  Netherlands 526
12.  Macau, China 525
13.  New Zealand 519
14.  Belgium 515
15.  Australia 514
16.  Germany 513
17.  Estonia 512
18.  Iceland 507
19.  Denmark 503
20.  Slovenia 501
21.  Norway 498
22.  France 497
23.  Slovakia 497
24.  Austria 496
25.  Poland 495
26.  Sweden 494
27.  Czech Republic 493
28.  United Kingdom 492
29.  Hungary 490
30.  United States 487
...
65.  Kyrgyzstan 331

Sciences
1. Shanghai, China 575
2.  Finland 554
3.  Hong Kong, China 549
4.  Singapore 542
5.  Japan 539
6.  South Korea 538
7.  New Zealand 532
8.  Canada 529
9.  Estonia 528
10.  Australia 527
11.  Netherlands 522
12.  Liechtenstein 520
13.  Germany 520
14.  Taiwan 520
15.  Switzerland 517
16.  United Kingdom 514
17.  Slovenia 512
18.  Macau, China 511
19.  Poland 508
20.  Ireland 508
21.  Belgium 507
22.  Hungary 503
23.  United States 502
24.  Norway 500
25.  Czech Republic 500
26.  Denmark 499
27.  France 498
28.  Iceland 496
29.  Sweden 495
30.  Latvia 494
...
65.  Kyrgyzstan 330

Reading
1. Shanghai, China 556
2.  South Korea 539
3.  Finland 536
4.  Hong Kong, China 533
5.  Singapore 526
6.  Canada 524
7.  New Zealand 521
8.  Japan 520
9.  Australia 515
10.  Netherlands 508
11.  Belgium 506
12.  Norway 503
13.  Estonia 501
14.  Switzerland 501
15.  Poland 500
16.  Iceland 500
17.  United States 500
18.  Liechtenstein 499
19.  Sweden 497
20.  Germany 497
21.  Ireland 496
22.  France 496
23.  Taiwan 495
24.  Denmark 495
25.  United Kingdom 494
26.  Hungary 494
27.  Portugal 489
28.  Macau, China 487
29.  Italy 486
30.  Latvia 484
...
65.  Kyrgyzstan 314

Top 30 countries in the PISA 2009 results for maths, sciences and reading. For the complete list, see the reference.

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade-8 country means is 0.84 in mathematics and 0.95 in science. The values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. Western countries perform slightly better in PISA, Eastern European and Asian countries in TIMSS; content balance and years of schooling explain most of the variation.[2]
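
How much an outlier-driven correlation can change is easy to demonstrate. In the sketch below the country means are invented for illustration and are not actual PISA or TIMSS data:

    import numpy as np

    # Hypothetical country means for two studies, sorted so the two
    # lowest-scoring (developing) countries come last.
    pisa  = np.array([544, 542, 538, 532, 529, 516, 503, 490, 466, 385])
    timss = np.array([540, 545, 530, 528, 531, 510, 505, 495, 470, 410])

    r_all = np.corrcoef(pisa, timss)[0, 1]

    # Dropping the two lowest-scoring countries restricts the range,
    # which typically lowers the correlation, as described above.
    r_trimmed = np.corrcoef(pisa[:-2], timss[:-2])[0, 1]

    print(round(r_all, 2), round(r_trimmed, 2))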

Topical studies

An evaluation of the 2003 results showed that countries that spent more on education did not necessarily do better. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, South Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more yet scored below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States, which came 24th out of the 29 countries compared.

Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in some, such as Germany.

It has been suggested that the Finnish language plays an important part in Finland's PISA success.[3]

International testing, including both PISA and TIMSS, has been a central part of many recent analyses of how cognitive skills relate to economic outcomes. These studies consider both individual earnings and aggregate growth differences of nations.[4]

In 2010, the PISA 2009 results revealed that Shanghai students scored highest in the world in every category (maths, reading and science). The OECD described Shanghai as a pioneer of educational reform, noting that "there has been a sea change in pedagogy" and that Shanghai "abandoned their focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[5]

Reception

For many countries, the first PISA results were surprising; in Germany and the United States, for example, the comparatively low scores brought on heated debate about how the school system should be changed.

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[6]

Research on causes of country differences

PISA, TIMSS and PIRLS and their organizers and researchers are restrained in giving reasons for the large and stable country differences, cautiously leaving this task to other researchers, especially in economics and psychology. Economic researchers have studied single educational policy factors such as central exams (John Bishop),[7] private schools, and streaming between schools at a later age (Hanushek/Woessmann).[8] An extensive literature on cross-country differences in scores has developed since 2000.[9]

Finland's stable, strong results have attracted a lot of attention. According to Hannu Simola,[10] they are due less to attributes of the educational system than to disciplined students; the respected status of teachers, which attracts good students to the profession; high teacher quality resulting from professional teacher education; conservative direct instruction ("teaching ex cathedra", "pedagogical conservatism"); low rates of immigration; fast diagnosis and treatment of learning problems, including special schools; and the culture of a small border country (as in Singapore and Taiwan) whose people feel they can survive only with effort. Others have suggested that Finland's low poverty rate is a reason for its success.[11][12]

Systematic analyses across different paradigms (culture, genes, wealth, educational policies) for 78 countries were presented by Heiner Rindermann and Stephen Ceci.[13] They report positive relationships between student ability and the educational level of adults, the amount and rate of preschool education, discipline, the quantity of institutionalized education, attendance at additional schools, early tracking, and the use of central exams and tests. Negative relationships were found with high repetition rates, late school enrollment, and large class sizes. In their opinion, the results suggest that international differences in cognitive competence could be narrowed by reforms in educational policy.

Criticism

United States

Critics such as Mel Riddile say that low performance in the United States is closely related to American poverty, though the same reasoning applies to other countries.[11][12] Riddile also showed that, when adjusted for poverty, the richest areas in the US (especially areas with less than 10% poverty) achieve an average PISA score of 551, higher than any other country.[12] In essence, the criticism is directed not so much against the Programme for International Student Assessment itself as against people who use PISA data uncritically to justify measures such as charter schools.[14]

The table below summarizes the scores of American schools by their poverty rates and compares them to countries with similar poverty rates.[12]

Country        | Poverty rate | PISA score (reading, 2009)
United States  | < 10%        | 551
Finland        | 3.4%         | 536
Netherlands    | 9.0%         | 508
Belgium        | 6.7%         | 506
Switzerland    | 6.8%         | 501
United States  | 10–24.9%     | 527
Canada         | 13.6%        | 524
New Zealand    | 16.3%        | 521
Japan          | 14.3%        | 520
Australia      | 11.6%        | 515
United States  | 25–49.9%     | 502
Estonia        |              | 501
Poland         | 14.5%        | 500
United States  | 50–74.9%     | 471
Austria        | 13.3%        | 471
Turkey         |              | 464
Chile          |              | 449
United States  | > 75%        | 446
Mexico         |              | 425

Source: NASSP[12]

Performance of U.S. states in international comparisons

Given the wide variation in the performance of students in different states within the United States, several comparisons have been made by calibrating international assessments to assessments in the United States. The U.S. has regularly tested students in mathematics and reading in individual states since the early 1990s through its National Assessment of Educational Progress (NAEP). Two studies have linked performance in individual states to national scores on PISA. The first compared those scoring at the advanced level in mathematics and reading according to NAEP with the corresponding performance on PISA for the "High School Class of 2009".[15] Overall, 30 nations did better than the U.S. at producing students at the advanced level of mathematics: six percent of U.S. students were advanced in mathematics, compared to 28 percent in Taiwan. Perhaps more significantly, the highest-ranked U.S. state (Massachusetts) was just seventeenth in the world rankings. In the second study, U.S. students in the "Class of 2011" who were proficient on the NAEP in mathematics (32 percent) ranked thirty-second among the nations participating in PISA.[16] Massachusetts was again the best U.S. state, but it ranked just ninth in the world.

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[17] The difference in apparent rankings, however, is almost entirely accounted for by the sampling of countries: PISA includes all of the OECD countries, while TIMSS's sampling is weighted much more toward developing countries.

China

Education professor Yong Zhao has said the high scores in China are due to excessive workload and testing, adding that it is "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle—Singapore, Korea, Japan, and Hong Kong."[18] Zhao also noted that most major Chinese media outlets did not pay much attention to the story.[18] Others have criticized Shanghai as an outlier within China, where most of the country has a lower quality of education.[12] According to a report by China's state news agency Xinhua, 18.9 million students were enrolled in four-year higher education institutions (non-vocational, etc.) in 2007, roughly 2% of the total Chinese population.[19]

Mexico

Some educators in Mexico have taken issue with how PISA was carried out in Mexico. The municipalities with the highest educational performance rankings (Monterrey, Querétaro, Guadalajara, Mérida, Puebla and Mexico City) were excluded from the PISA test, as local PISA officials concluded that the quality of education in these municipalities was substantially higher than the national average or that of rural municipalities throughout the country, and that including them would skew the overall results. Critics of this decision have repeatedly pointed out that these six municipalities should be included in the PISA testing circuit, as together they make up a full 30% of the Mexican population and are the driving educational, economic and social forces of Mexico.

Luxembourg

Criticism has ensued in Luxembourg, which scored quite low, over the method used in its PISA test. Although the country is trilingual (Luxembourgish, French and German), the 2000 test was not allowed to be taken in Luxembourgish, the mother tongue of the majority of students (77%).

Portugal

According to the OECD's PISA, the average Portuguese 15-year-old student was for many years underperforming within the OECD in reading literacy, mathematics and science knowledge, nearly tied with Italian students and just above students from countries like Greece, Turkey and Mexico. Since 2010, however, PISA results for Portuguese students have improved dramatically. The Portuguese Ministry of Education announced a 2010 report published by its office for educational evaluation, GAVE (Gabinete de Avaliação do Ministério da Educação), which criticized the results of the PISA 2009 report and claimed that the average Portuguese teenage student had profound handicaps in expression, communication and logic, as well as a low performance when asked to solve problems. GAVE also claimed that these shortcomings are not exclusive to Portugal but occur in other countries as well, due to the way PISA was designed.[20]

References

  1. ^ Official PISA site data. For list See "Executive Summary"
  2. ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March, 2008. [2].
  3. ^ Why does Finnish give better PISA results?
  4. ^ Eric A. Hanushek, and Ludger Woessmann. 2008. "The role of cognitive skills in economic development." Journal of Economic Literature 46, no. 3 (September): 607-668.
  5. ^ Peter Gumbel (Paris). "China Beats Out Finland for Top Marks in Education". TIME. http://www.time.com/time/world/article/0,8599,2035586,00.html.
  6. ^ "Waiting for "Superman" trailer". http://www.youtube.com/watch?v=ZKTfaro96dg. Retrieved October 8, 2010. 
  7. ^ Bishop, J. H. (1997). The effect of national standards and curriculum-based exams on achievement. American Economic Review, 87, 260-264.
  8. ^ Hanushek, E. A. & Woessmann, L. (2006). Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries. Economic Journal, 116, C63-C76.
  9. ^ Hanushek, Eric A., and Ludger Woessmann. 2011. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89-200.
  10. ^ Simola, H. (2005). The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education. Comparative Education, 41, 455-470.
  11. ^ a b "The Economics Behind International Education Rankings" National Educational Association
  12. ^ a b c d e "PISA: It's Poverty Not Stupid" National Association of Secondary School Principals
  13. ^ Rindermann, H. & Ceci, S. J. (2009). Educational policy and country outcomes in international cognitive competence studies. Perspectives on Psychological Science, 4, 551-577.
  14. ^ Joanne Barkan (winter 2011). "Got Dough? How Billionaires Rule Our Schools". Dissent magazine. http://dissentmagazine.org/article/?article=3781. Retrieved January 28, 2011. 
  15. ^ Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10-18.
  16. ^ Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state’s international standing." Education Next 11, no. 4 (Fall): 51-59.
  17. ^ Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (November 14); Gary W. Phillips (2009) The Second Derivative:International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  18. ^ a b "A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance", Yong Zhao
  19. ^ "FACTBOX: Education in China", Xinhua News Agency
  20. ^ (Portuguese) Estudo do ministério aponta graves problemas aos alunos portugueses ("Ministry study points to serious problems among Portuguese pupils"), GAVE (Gabinete de Avaliação do Ministério da Educação) 2010 report, in RTP.
